-
Some of the basic properties of any dynamical system can be summarized by a graph. The dynamical systems in our theory range from maps like the logistic map, to ordinary differential equations, to dissipative partial differential equations. Our goal has been to define a meaningful concept of the graph of a dynamical system. We base our definition of the “chain graph” on “epsilon-chains”, defining both the nodes and the edges of the graph in terms of chains. In particular, nodes are often maximal limit sets, and there is an edge between two nodes if there is a trajectory whose forward limit set lies in one node and whose backward limit set lies in the other. Our initial goal was to prove that every “chain graph” of a dynamical system is, in some sense, connected, and we prove connectedness under mild hypotheses.
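To make the construction concrete, here is a minimal numerical sketch of the epsilon-chain idea applied to the logistic map. The grid resolution, the value of epsilon, and the use of strongly connected components as a stand-in for the chain-recurrent classes are illustrative assumptions, not the authors' construction.

```python
# A toy chain graph for the logistic map on a finite grid.
import numpy as np
import networkx as nx

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

# Discretize [0, 1]; an epsilon-chain step from x to y is allowed
# whenever |f(x) - y| < eps.
n, eps = 200, 0.02
grid = np.linspace(0.0, 1.0, n)

G = nx.DiGraph()
G.add_nodes_from(range(n))
for i, x in enumerate(grid):
    fx = logistic(x)
    for j, y in enumerate(grid):
        if abs(fx - y) < eps:
            G.add_edge(i, j)

# Nodes of the toy chain graph: strongly connected components of the
# epsilon-chain relation (chain-recurrent classes at this resolution).
# Edges: the condensation records which component is downstream of which.
chain_graph = nx.condensation(G)
print(chain_graph.number_of_nodes(), "nodes,",
      chain_graph.number_of_edges(), "edges")
```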
-
While studying gradient dynamical systems, Morse introduced the idea of encoding the qualitative behavior of a dynamical system in a graph. Smale later refined Morse’s idea and extended it to Axiom-A diffeomorphisms on manifolds. In Smale’s vision, nodes are indecomposable closed invariant subsets of the non-wandering set with a dense orbit, and there is an edge from node M to node N (we say that N is downstream from M) if the unstable manifold of M intersects the stable manifold of N. Since then, the decomposition of the non-wandering set has been studied in many other settings, while the edge component of Smale’s construction has often been overlooked. In the same years, more sophisticated generalizations of the non-wandering set, introduced by Birkhoff in the 1920s, were elaborated first by Auslander in the early 1960s, then by Conley in the early 1970s, and later by Easton and other authors. In our language, each of these generalizations introduces a closed and transitive extension of the prolongational relation, which is closed but not transitive. In the present article, we develop a theory that generalizes both of these lines of research at the same time. We study the general properties of closed transitive relations (which we call streams) containing the space of orbits of a discrete-time or continuous-time semi-flow, and we argue that these relations play a central role in the qualitative study of dynamical systems. All of the most studied concepts of recurrence currently in the literature can be defined in terms of our streams. Finally, we show how to associate to each stream a graph encoding its qualitative properties. Our main general result is that each stream of a semi-flow with “compact dynamics” has a connected graph. The range of semi-flows covered by our theorem runs from one-dimensional discrete-time systems like the logistic map up to infinite-dimensional continuous-time systems like the semi-flow of quasilinear parabolic reaction–diffusion partial differential equations.
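As a loose finite-state illustration of a stream, one can start from a relation containing the orbits of a map and close it under transitivity (closedness is automatic on a finite space). The doubling map on Z/nZ and the state count below are illustrative assumptions only; they sketch the idea, not the paper's theory.

```python
# Smallest transitive relation containing the orbit relation of a toy map.
import numpy as np

n = 64
states = np.arange(n)
step = (2 * states) % n            # doubling map on Z/nZ as a toy semi-flow

# R[i, j] = True means (i, j) is in the relation. Seed it with the orbit
# relation i -> f(i) plus the diagonal.
R = np.zeros((n, n), dtype=bool)
R[states, step] = True
R[states, states] = True

# Warshall's algorithm: transitive closure of R.
for k in range(n):
    R |= np.outer(R[:, k], R[k, :])

print("pairs in the toy stream:", int(R.sum()))
```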
-
Convolution is a central operation in Convolutional Neural Networks (CNNs), which applies a kernel to overlapping regions shifted across the image. However, because of the strong correlations in real-world image data, convolutional kernels are in effect re-learning redundant data. In this work, we show that this redundancy has made neural network training challenging, and we propose network deconvolution, a procedure that optimally removes pixel-wise and channel-wise correlations before the data is fed into each layer. Network deconvolution can be computed efficiently at a fraction of the computational cost of a convolution layer. We also show that the deconvolution filters in the first layer of the network resemble the center-surround structure found in biological neurons in the visual regions of the brain. Filtering with such kernels results in a sparse representation, a desirable property that has been missing from the training of neural networks. Learning from the sparse representation promotes faster convergence and superior results without the use of batch normalization. We apply our network deconvolution operation to 10 modern neural network models by replacing batch normalization in each. Extensive experiments show that the network deconvolution operation delivers performance improvements in all cases on the CIFAR-10, CIFAR-100, MNIST, Fashion-MNIST, Cityscapes, and ImageNet datasets.
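The core decorrelation step can be sketched as ZCA whitening of im2col patches, so that pixel-wise and channel-wise correlations are removed before the data reaches a layer. The shapes, the 3x3 kernel, and the regularizer eps below are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of the decorrelation idea behind network deconvolution.
import numpy as np

def deconvolve_input(x, k=3, eps=1e-5):
    """x: (N, C, H, W) batch. Returns one whitened k*k*C patch per pixel."""
    N, C, H, W = x.shape
    p = k // 2
    xp = np.pad(x, ((0, 0), (0, 0), (p, p), (p, p)), mode="reflect")
    # im2col: every k x k x C patch becomes a row vector of length C*k*k.
    cols = np.stack([xp[:, :, i:i + H, j:j + W]
                     for i in range(k) for j in range(k)], axis=2)
    cols = cols.reshape(N, C * k * k, H * W).transpose(0, 2, 1)
    cols = cols.reshape(-1, C * k * k)
    cols -= cols.mean(axis=0)
    # Covariance of patch features, then its inverse square root (ZCA).
    cov = cols.T @ cols / cols.shape[0]
    vals, vecs = np.linalg.eigh(cov + eps * np.eye(cov.shape[0]))
    zca = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return cols @ zca                # decorrelated patches

whitened = deconvolve_input(np.random.randn(2, 3, 8, 8))
print(whitened.shape)                # (2*8*8, 3*3*3): rows are white patches
```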